
    Flux sensing device using a tubular core with toroidal gating coil and solenoidal output coil wound thereon (Patent)

    Flux gate magnetometer with toroidal gating coil and solenoidal output coil for signal modulation or amplification.

    Fate of the Peak Effect in a Type-II Superconductor: Multicriticality in the Bragg-Glass Transition

    We have used small-angle neutron scattering (SANS) and ac magnetic susceptibility to investigate the global magnetic field H vs temperature T phase diagram of a single-crystal Nb sample in which a first-order transition of Bragg-glass melting (disordering), a peak effect, and surface superconductivity are all observable. It was found that the disappearance of the peak effect is directly related to a multicritical behavior in the Bragg-glass transition. Four characteristic phase boundary lines have been identified on the H-T plane: a first-order line at high fields, a mean-field-like continuous transition line at low fields, and two continuous transition lines associated with the onset of surface and bulk superconductivity. All four lines are found to meet at a multicritical point.

    Using Fuzzy Linguistic Representations to Provide Explanatory Semantics for Data Warehouses

    A data warehouse integrates large amounts of extracted and summarized data from multiple sources for direct querying and analysis. While it provides decision makers with easy access to such historical and aggregate data, the real meaning of the data has been ignored: for example, whether a total sales amount of 1,000 items indicates good or bad sales performance is still unclear. From the decision makers' point of view, it is the semantics rather than the raw numbers that conveys the meaning of the data, and this semantics is therefore very important. In this paper, we explore the use of fuzzy technology to provide this semantics for the summarizations and aggregates developed in data warehousing systems. A three-layered data warehouse semantic model, consisting of quantitative (numerical) summarization, qualitative (categorical) summarization, and quantifier summarization, is proposed for capturing and explicating the semantics of warehoused data. Based on the model, several algebraic operators are defined. We also extend the SQL language to allow for flexible queries against such enhanced data warehouses.
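
    As a rough illustration of how the qualitative and quantifier layers might sit on top of a numeric aggregate, the Python sketch below attaches fuzzy linguistic labels to a sales total and evaluates a fuzzy quantifier over a set of totals. The labels, membership functions, and thresholds are invented for this sketch and are not the operators defined in the paper.

    ```python
    # Hypothetical illustration of layering qualitative (label) and quantifier
    # summaries on top of a numeric aggregate. All labels and membership
    # functions below are assumptions made for this sketch.

    def trapezoid(x, a, b, c, d):
        """Trapezoidal membership: rises on [a, b], flat on [b, c], falls on [c, d]."""
        if x <= a or x >= d:
            return 0.0
        if b <= x <= c:
            return 1.0
        if a < x < b:
            return (x - a) / (b - a)
        return (d - x) / (d - c)

    # Qualitative summarization: map a numeric total to linguistic labels.
    SALES_LABELS = {
        "bad":  lambda x: trapezoid(x, -1, 0, 400, 800),
        "fair": lambda x: trapezoid(x, 400, 800, 1200, 1600),
        "good": lambda x: trapezoid(x, 1200, 1600, 5000, 5001),
    }

    def qualify(total):
        """Return the label with the highest membership for a numeric aggregate."""
        return max(SALES_LABELS.items(), key=lambda kv: kv[1](total))

    def most(proportion):
        """Fuzzy quantifier 'most': truth rises as the proportion goes from 0.3 to 0.8."""
        return trapezoid(proportion, 0.3, 0.8, 1.0, 1.01)

    # Quantifier summarization: truth degree of "most regions had good sales".
    regional_totals = [950, 1400, 1700, 2100, 600]
    good_degrees = [SALES_LABELS["good"](t) for t in regional_totals]
    truth = most(sum(good_degrees) / len(good_degrees))

    label, membership = qualify(1000)
    print(f"total=1000 -> '{label}' (membership {membership(1000):.2f})")
    print(f"'Most regions had good sales' holds to degree {truth:.2f}")
    ```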

    Gauge dilution and leptogenesis

    In this paper, we examine how gauge interactions can dilute the lepton asymmetry in lepton-induced baryogenesis. Constraints imposed on Majorana masses keep this dilution at an acceptable level.

    Data complexity in machine learning

    We investigate the role of data complexity in the context of binary classification problems. The universal data complexity is defined for a data set as the Kolmogorov complexity of the mapping enforced by the data set. It is closely related to several existing principles used in machine learning, such as Occam's razor, the minimum description length, and the Bayesian approach. The data complexity can also be defined based on a learning model, which is more realistic for applications. We demonstrate the application of data complexity in two learning problems: data decomposition and data pruning. In data decomposition, we illustrate that a data set is best approximated by its principal subsets, which are Pareto optimal with respect to the complexity and the set size. In data pruning, we show that outliers usually have high complexity contributions, and we propose methods for estimating the complexity contribution. Since in practice we have to approximate the ideal data complexity measures, we also discuss the impact of such approximations.
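
    To make the idea of a learning-model-based complexity and of per-example complexity contributions concrete, here is a minimal Python sketch. It uses a decision-stump model class with a crude two-part code (a fixed cost for the stump plus a fixed cost per misclassified example); the constants, helper names, and the stump family are assumptions chosen for illustration, not the measures developed in this work.

    ```python
    # Sketch of a model-based surrogate for data complexity: complexity(D) is
    # the shortest two-part code over decision stumps, and each example's
    # contribution is how much the complexity drops when it is removed.
    # Outliers (mislabeled points) should show large contributions.

    from itertools import product

    STUMP_BITS = 16   # assumed fixed cost to encode a stump (threshold + sign)
    ERROR_BITS = 8    # assumed cost to encode each exception (misclassified example)

    def stump_errors(data, threshold, sign):
        """Errors of the stump that predicts `sign` when x > threshold, else -sign."""
        return sum(1 for x, y in data if (sign if x > threshold else -sign) != y)

    def complexity(data):
        """Smallest two-part code length over all stumps induced by the data."""
        thresholds = sorted({x for x, _ in data})
        best_errors = min(
            stump_errors(data, t, s) for t, s in product(thresholds, (+1, -1))
        )
        return STUMP_BITS + ERROR_BITS * best_errors

    def contributions(data):
        """Per-example contribution: complexity(D) minus complexity of D without it."""
        total = complexity(data)
        return [total - complexity(data[:i] + data[i + 1:]) for i in range(len(data))]

    # Toy 1-D data set: positives above x = 5, plus one mislabeled outlier at x = 9.
    data = [(1.0, -1), (2.0, -1), (3.0, -1), (6.0, +1), (7.0, +1), (8.0, +1), (9.0, -1)]
    for (x, y), c in zip(data, contributions(data)):
        print(f"x={x:4.1f} y={y:+d}  contribution={c} bits")
    ```

    Under these assumptions, only the mislabeled point at x = 9 has a nonzero contribution, which is the intuition behind using complexity contributions for data pruning.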